Data quality — Data are of high quality if they are fit for their intended uses in operations, decision making and planning (J. M. Juran). Alternatively, data are deemed of high quality if they correctly represent the real-world construct to which they… … Wikipedia
Data migration — is the process of transferring data between storage types, formats, or computer systems. Data migration is usually performed programmatically to achieve an automated migration, freeing up human resources from tedious tasks. It is required when… … Wikipedia
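A programmatic, format-to-format migration of the kind the entry describes might be sketched as follows; the CSV source and JSON target are illustrative assumptions, not part of the original entry.

```python
import csv
import io
import json

def migrate_csv_to_json(csv_text):
    """Minimal data migration between storage formats: parse CSV text
    into records, then serialize those records as JSON."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

# Hypothetical source data standing in for a legacy CSV store.
legacy = "id,name\n1,alpha\n2,beta\n"
migrated = migrate_csv_to_json(legacy)
```

Running the whole migration as a script, rather than by hand, is what the entry means by an automated migration.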
Data Validation and Certification Server — (DVCS) is a public key infrastructure or PKI service providing data validation services, asserting correctness of digitally signed documents, validity of public key certificates and possession or existence of data. In practical applications DVCS… … Wikipedia
Data feed — is a mechanism for users to receive updated data from data sources. It is commonly used by real-time applications in point-to-point settings as well as on the World Wide Web. The latter is also called Web feed. News feed is a popular form of Web… … Wikipedia
Data erasure — (also called data clearing or data wiping) is a software-based method of overwriting data that completely destroys all electronic data residing on a hard disk drive or other digital media. Permanent data erasure goes beyond basic file deletion… … Wikipedia
Data stream mining — is the process of extracting knowledge structures from continuous, rapid data records. A data stream is an ordered sequence of instances that in many applications of data stream mining can be read only once or a small number of times using… … Wikipedia
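The read-once constraint mentioned in the entry is what single-pass algorithms are built around. Reservoir sampling is one standard example: it keeps a uniform sample of a stream while reading each instance exactly once. A minimal sketch (the fixed seed is only for reproducibility):

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Keep a uniform random sample of k items from a stream that is
    read exactly once -- the access pattern typical of stream mining."""
    rng = rng or random.Random(0)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randint(0, i)        # replace with probability k/(i+1)
            if j < k:
                sample[j] = item
    return sample
```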
Data collection — is a term used to describe a process of preparing and collecting data, for example, as part of a process improvement or similar project. The purpose of data collection is to obtain information to keep on record, to make decisions about important… … Wikipedia
Data Format Description Language — (DFDL, often pronounced daff-o-dil) is a modeling language from the Open Grid Forum for describing general text and binary data. A DFDL model or schema allows any text or binary data to be read (or parsed) from its native format and to be… … Wikipedia
Data Validation — [Engl.], data checking … Universal-Lexikon
Data verification — is a process in which data are checked for accuracy and consistency after a data migration has been performed.[1] It helps to determine whether the data were accurately translated when transported from one source to another, are complete, and support … Wikipedia
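One common post-migration check of the kind described is to compare record counts and content digests between source and target. A minimal sketch, assuming records can be represented as strings; the helper names are illustrative, not a standard API.

```python
import hashlib

def checksum(records):
    """Order-independent digest over a collection of string records."""
    h = hashlib.sha256()
    for rec in sorted(records):
        h.update(rec.encode("utf-8"))
    return h.hexdigest()

def verify_migration(source, target):
    """Completeness check (same record count) plus accuracy check
    (same content, regardless of record order)."""
    return len(source) == len(target) and checksum(source) == checksum(target)
```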
Data General AOS — Data General wrote operating systems for its hardware: DOS and RDOS for the 16 bit Nova line; RDOS and AOS for the 16 bit Eclipse C, M and S lines; AOS/VS and AOS/RT32[1] (1980) and later AOS/VS II (1988) for the 32 bit Eclipse MV line. A… … Wikipedia